Method, media validator and media depository for detecting tinting on a media item
Patent abstract:
DYE DETECTION. A method of detecting tinting in a media item is described. The method comprises receiving an image of the media item, where the image comprises a plurality of pixels having different intensity values within a range of intensity values. Center weighting is applied to the received image to expand the central portion of the intensity value range. A threshold is applied to each pixel in the centrally weighted image to transform each pixel into a binary value, thereby creating an evaluation image comprising a plurality of pixels, each having one of two possible values. A difference image is created by comparing a pixel in the evaluation image with a pixel in a binary reference image at a corresponding spatial location, so that the difference image includes (i) a dye pixel at each spatial location where the pixel in the evaluation image has a low intensity and the corresponding pixel in the binary reference image has a high intensity, and (ii) a non-dye pixel at all other spatial locations. The media item is identified as dyed in (...).
Publication number: BR102012023646B1
Application number: R102012023646-0
Filing date: 2012-09-19
Publication date: 2020-12-01
Inventors: Ping Chen; Chao He; Gary Ross
Applicant: NCR Corporation
IPC main classification:
Patent description:
[0001] The present invention relates to automated dye detection. Particularly, though not exclusively, the invention relates to the automated detection of tinting of a media item, such as a banknote, in a self-service terminal.
Background of the invention
[0002] Some self-service terminals, such as automated teller machines (ATMs), can receive banknotes deposited by a customer. Some anti-theft systems include automatic dyeing of banknotes when a banknote cassette is removed, or otherwise accessed, by an unauthorized person. Such systems cause the cassette to discharge an ink stain onto the stack of banknotes contained within the cassette. This dyeing of banknotes is highly visible and is designed to alert people who may receive a dyed banknote that the banknote may have been stolen.
[0003] To avoid alerting people that a banknote is stolen, criminals can deposit dyed banknotes into a bank account using an ATM so that no one is present to look at the deposited banknote.
[0004] In addition, banknotes can become accidentally dyed, for example, by spilled ink, coffee, or some other liquid.
[0005] Banknote issuing authorities (such as the European Central Bank) wish to withdraw dyed banknotes from circulation (regardless of whether those banknotes were dyed as a result of theft, or dyed accidentally), so it is desirable that an ATM can detect dyed banknotes when such banknotes are presented to it.
[0006] While it is easy for a person to identify tinting on a banknote, it is much more difficult for an automated system, because a banknote can be presented in four different orientations and the use of a single visible light color to produce an image of the banknote may not be sufficient to detect dyeing, since the dyeing may be the same color as the light source.
Summary of the invention
[0007] Consequently, the invention generally provides methods, systems, equipment and software for detecting tinting in a media item.
[0008] In addition to the Summary of the Invention provided above and the subject matter disclosed below in the Detailed Description, the following paragraphs of this section are intended to provide an additional basis for alternative claim language for possible use during prosecution of this application, if required. If this application is granted, some aspects may relate to claims added during prosecution of this application, other aspects may relate to claims presented during prosecution, and other aspects may relate to the subject matter being claimed. In addition, the various aspects detailed below are independent of each other, except where otherwise stated. Any claim corresponding to one aspect should not be considered to incorporate any element or feature of the other aspects unless explicitly stated in that claim.
[0009] According to a first aspect, a method of detecting tinting in a media item is provided, the method comprising: receiving an image of the media item, where the image comprises a plurality of pixels having different intensity values within a range of intensity values; using pixels from the image having intensity values within a central portion of the intensity value range to create a centrally weighted image; applying a threshold to each pixel in the centrally weighted image to transform each pixel into a binary value, thereby creating an evaluation image comprising a plurality of pixels, each pixel representing either a high or a low intensity; calculating a difference image between a binary reference image and the evaluation image by comparing a pixel in the evaluation image with a pixel in the binary reference image at a corresponding spatial location, so that the difference image includes (i) a dye pixel at each spatial location where the pixel in the evaluation image has a low intensity and the corresponding pixel in the binary reference image has a high intensity, and (ii) a non-dye pixel at all other spatial locations; and indicating that the media item is dyed in the event that the difference image meets a dyeing criterion.
[0010] The step of using pixels from the image having intensity values within a central portion of the intensity value range to create a centrally weighted image may comprise contrast stretching the received image to expand a central portion of the intensity value range so that the central portion extends across almost the entire range of intensity values.
[0011] Alternatively, the step of using pixels from the image having intensity values within a central portion of the intensity value range to create a centrally weighted image may comprise: (i) ignoring pixels having an intensity value below a low cut-off value, and (ii) ignoring pixels having an intensity value above a high cut-off value.
[0012] Whichever method is used to create a centrally weighted image, the important point is that those pixels having a very low intensity or a very high intensity are either (a) ignored, or (b) adjusted to match the lowest intensity or the highest intensity, respectively.
[0013] The method may comprise the additional step of capturing an image of the media item prior to the step of receiving an image of the media item. The step of capturing an image of the media item may further comprise capturing a transmission image of the media item. A transmission image can be captured using an electromagnetic radiation transmitter on one side of the media item and an electromagnetic radiation detector on the opposite side of the media item. In one embodiment, the electromagnetic radiation used is infrared radiation. The use of infrared radiation has the advantage of being independent of the color of any dyeing on the media item.
[0014] The step of capturing an image of the media item may include using eight bits to record the intensity value for each pixel (providing a range of intensity values from 0 to 255). Alternatively, any convenient number of bits can be used, such as 16 bits, which would provide a range of intensity values from 0 to 65535.
[0015] The method may comprise the additional step of adjusting the spatial dimensions of the received image so that the received image matches the spatial dimensions of the binary reference image.
This would compensate for any media items that have missing edge portions, added portions (such as tape), or that have shrunk or expanded, or the like. Techniques for automatically aligning a captured image with a reference image, and then cropping or padding the captured image to match the spatial dimensions of the reference image, are well known in the art.
[0016] The step of contrast stretching the received image to expand a central portion of the intensity value range may comprise X percent saturation at the low and high ends of the pixel intensity values. X percent may comprise ten percent, five percent, two percent, or any other convenient value.
[0017] As is known to those skilled in the art, a five percent saturation at both low and high pixel intensity values means that, when all pixels in the received image are arranged in order of pixel intensity, (i) all pixels having a pixel intensity lower than a reference intensity (the five percent value of the reference being used) are assigned the same minimum pixel intensity value (which can be zero); and (ii) all pixels having a pixel intensity higher than a reference intensity (the 95 percent value of the reference being used) are assigned the same maximum pixel intensity value (which can be 255 if eight bits are used for each pixel intensity value). This improves image contrast (by expanding the central portion of the range of intensity values to cover the full available range) and reduces the effects of punctures and other minor anomalies in the media item or image. The reference being used can be the pixels in the received image, or alternatively, the reference being used can be the pixels in an image from which the binary reference image was created.
[0018] The step of applying a threshold to each pixel in the centrally weighted image may further comprise: (i) ascertaining, from a reference image from which the binary reference image was created, a threshold pixel intensity at which Y percent of all pixels in the reference image have a pixel intensity below the threshold pixel intensity, (ii) assigning a first binary value (for example, zero) to each pixel in the centrally weighted image having a pixel intensity below or equal to the threshold pixel intensity, and (iii) assigning a second binary value (for example, one) to each pixel in the centrally weighted image having a pixel intensity above the threshold pixel intensity. The value of Y can be twenty (percent), ten (percent), or any other convenient number. The value of Y selected may depend on the characteristics of the media item (such as transmission characteristics, printing colors used, reflection features, and the like). By using a threshold pixel intensity derived from the reference image, a threshold that is correct for genuine media items is used; whereas the centrally weighted image may not be derived from a genuine media item (for example, the presented media item may be a forgery).
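By way of illustration only, the percentile-based thresholding described in paragraph [0018] might be sketched as follows in Python with NumPy. The function and variable names, and the example value of Y, are assumptions made for this sketch rather than part of the specification.

```python
# Hypothetical sketch of the Y-percent thresholding of paragraph [0018];
# names and the example value of Y are illustrative assumptions only.
import numpy as np

def binarize_against_reference(weighted_image, reference_image, y_percent=20):
    """Assign binary 0 (low intensity) to pixels at or below the threshold pixel
    intensity derived from the reference image, and binary 1 (high intensity)
    to pixels above it."""
    # Intensity below which Y percent of the reference image pixels fall.
    threshold = np.percentile(reference_image, y_percent)
    return (weighted_image > threshold).astype(np.uint8)
```

In this sketch the threshold is taken from the reference image, matching the first alternative in paragraph [0018]; computing it from the centrally weighted image itself, as in paragraph [0019] below, would simply mean passing the same array for both arguments.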
[0019] Alternatively, the step of applying a threshold to each pixel in the centrally weighted image may further comprise: (i) ascertaining, from the centrally weighted image, a threshold pixel intensity at which Y percent of all pixels in the centrally weighted image have a pixel intensity below the threshold pixel intensity, (ii) assigning a first binary value (for example, zero) to each pixel in the centrally weighted image having a pixel intensity below or equal to the threshold pixel intensity, and (iii) assigning a second binary value (for example, one) to each pixel in the centrally weighted image having a pixel intensity above the threshold pixel intensity.
[0020] As a further alternative, the step of applying a threshold to each pixel in the centrally weighted image may further comprise: (i) using a predefined threshold pixel intensity, (ii) assigning a first binary value (for example, zero) to each pixel in the centrally weighted image having a pixel intensity below or equal to the predefined threshold pixel intensity, and (iii) assigning a second binary value (for example, one) to each pixel in the centrally weighted image having a pixel intensity above the predefined threshold pixel intensity.
[0021] Prior to the step of calculating a difference image, the method may comprise the additional steps of (i) comparing an orientation of the evaluation image with an orientation of the binary reference image, and (ii) where the orientations do not match, implementing a geometric transformation of the evaluation image to match the orientation of the evaluation image with the orientation of the binary reference image.
[0022] The geometric transformation may comprise rotating and/or flipping the evaluation image as required.
[0023] This reorientation step has the advantage that only one binary reference image is needed (rather than four binary reference images, one for each possible media item insertion orientation). This allows the media item to be inserted in any of the four possible orientations. In systems where a media item can be inserted either long edge first or short edge first, there are eight possible orientations.
[0024] The dyeing criterion may comprise the difference image including contiguous dye pixels covering an area that exceeds a maximum allowable dye area.
[0025] The step of indicating that the media item is dyed in the event that the difference image includes contiguous dye pixels covering an area exceeding a maximum allowable dye area may include verifying whether an area of A mm by B mm includes only dye pixels. For example, if an area of 9 mm by 9 mm includes only dye pixels, then guidelines from the European Central Bank state that this should be considered as representing a dyed banknote.
[0026] Alternatively, the step of indicating that the media item is dyed in the event that the difference image includes contiguous dye pixels covering an area that exceeds a maximum allowable dye area may include verifying whether an area of A mm by B mm consists essentially of dye pixels. In other words, the media item can be indicated as dyed despite the presence of one or two non-dye pixels in the area of A mm by B mm, where A and B are numbers (either the same number or different numbers).
[0027] The method may comprise the additional step of identifying the media item.
[0028] The media item may comprise a banknote, a check, a credit transfer, a shipping slip (each of the foregoing being a financial document), or a non-financial media item (such as a label for designer goods, or a certificate).
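As an informal illustration of the geometric transformation mentioned in paragraphs [0021] to [0023], the sketch below maps four short-edge-first insertion orientations onto the reference orientation using flips and a rotation. The orientation labels, and the particular transform assigned to each, are assumptions for this example, not a definitive mapping.

```python
# Hypothetical sketch of the reorientation step of paragraphs [0021] to [0023];
# orientation labels and flip assignments are illustrative assumptions.
import numpy as np

def reorient(evaluation_image, orientation):
    """Bring an evaluation image captured in one of four insertion orientations
    into the orientation of the binary reference image."""
    if orientation == "reference":            # already matches the reference
        return evaluation_image
    if orientation == "rotated_180":          # other short edge led into the device
        return np.rot90(evaluation_image, 2)
    if orientation == "flipped_long_axis":    # other face up, same leading edge
        return np.flipud(evaluation_image)
    if orientation == "flipped_short_axis":   # other face up, other leading edge
        return np.fliplr(evaluation_image)
    raise ValueError(f"unknown orientation: {orientation}")
```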
[0029] It will be appreciated that a non-dye pixel is populated in the difference image at each spatial location in which a pixel in the evaluation image has either: (a) a low intensity, where the corresponding pixel in the binary reference image also has a low intensity, or (b) a high intensity. The media item is then indicated as dyed in the event that the difference image meets a dyeing criterion.
[0030] The binary reference image (and/or the final binary reference image) can be referred to as a non-dye template.
[0031] According to a second aspect, a media validator operable to detect dyeing in a media item presented to it is provided, the media validator comprising: a media item transport to transport a media item; an image capture device aligned with the media item transport and operable to capture a two-dimensional array of pixels corresponding to the media item, each pixel having a pixel intensity related to a property of the media item at a spatial location on the media item corresponding to that pixel; and a processor programmed to control the media transport and the image capture device, and also programmed to: receive the two-dimensional array of pixels; centrally weight the received two-dimensional array of pixels; apply a threshold to each pixel in the centrally weighted array of pixels to transform each pixel into a binary value, thereby creating an evaluation image comprising a plurality of pixels, each having one of two possible values; calculate a difference image between a binary reference image and the evaluation image by comparing a pixel in the evaluation image with a pixel in the binary reference image at a corresponding spatial location, so that the difference image includes (i) a dye pixel at each spatial location where a pixel in the evaluation image has a low intensity and the corresponding pixel in the binary reference image has a high intensity, and (ii) a non-dye pixel at all other spatial locations; and indicate whether the media item is dyed in the event that the difference image meets a dyeing criterion.
[0032] The media item transport may comprise one or more endless belts, sliding plates, rollers, and the like.
[0033] The image capture device may comprise a two-dimensional sensor, such as a CCD contact image sensor (CIS), having a sensor area at least as large as the area of the media item. This allows an entire two-dimensional image to be captured at once. Alternatively, the image capture device may comprise a linear sensor (covering one dimension of the media item, but not both dimensions) that captures a strip of the media item as the media item passes the linear sensor, so that when the entire media item has passed the linear sensor, a complete two-dimensional image of the media item can be constructed from the sequence of images captured by the linear sensor. This would make it possible for a lower cost sensor to be used because a smaller detection area (only as large as one dimension of the media item) would be sufficient.
[0034] The image capture device may further comprise a light source. The light source may comprise an infrared radiation source.
[0035] The image capture device can be located on the opposite side of the media item (the opposite side of the media item path when no media item is present) in relation to the light source so that a transmission image is captured.
Alternatively, but less advantageously, the image capture device can be located on the same side of the media item as the light source so that a reflectance image is captured.
[0036] The media validator may comprise a banknote validator. The banknote validator can be incorporated into a media depository, which can be incorporated into a self-service terminal, such as an ATM.
[0037] According to a third aspect, a computer program programmed to implement the steps of the first aspect is provided.
[0038] According to a fourth aspect, a method of detecting tinting in a media item is provided, the method comprising: receiving an image of the media item, where the image comprises a plurality of pixels having different intensity values within a range of intensity values; applying a threshold to each pixel in the received image to transform each pixel into a binary value, thereby creating an evaluation image comprising a plurality of pixels, each having one of two possible values; calculating a difference image between a binary reference image and the evaluation image; and indicating that the media item is dyed in the event that the difference image meets a dyeing criterion.
[0039] According to a fifth aspect, a method of creating a binary reference image for use in detecting dyeing in a media item is provided, the method comprising: receiving a plurality of images, each image relating to a media item of the same type and in a common orientation, each image comprising a plurality of pixels having different intensity values within a range of intensity values, and each pixel corresponding to a spatial location on the media items; for each spatial location, averaging the pixel values from the plurality of images to create a single pixel value at that spatial location and thereby create a single average image having a range of intensity values; contrast stretching the single average image to expand a central portion of the intensity value range; and applying a threshold to each pixel in the contrast stretched image to transform each pixel into a binary value and thereby create a binary reference image comprising a plurality of pixels, each having either a high intensity or a low intensity.
[0040] The method may comprise the additional step of applying a neighborhood-based minimum filter to the binary reference image to create a final binary reference image.
[0041] The step of applying a neighborhood-based minimum filter may comprise the steps of (i) preparing an output matrix having the same dimensions as the final binary reference image, (ii) for each pixel location Pij in the output matrix, examining the NxN neighborhood of the corresponding pixel location in the binary reference image and obtaining the lowest intensity value in that neighborhood, and then (iii) setting that lowest intensity value at Pij in the output matrix. This has the advantage of enlarging the dark (low intensity) areas in each NxN arrangement in both the horizontal direction and the vertical direction, to avoid any errors introduced by printing on the media item, and the like.
[0042] Alternatively, any other convenient method of dilating low intensity pixels can be used.
[0043] Minimum filtering based on an NxN neighborhood is well known in the art. The NxN arrangement may comprise a 3x3 arrangement, a 4x4 arrangement, a 2x4 arrangement, or any other suitable arrangement size.
[0044] According to a sixth aspect, a computer program programmed to implement the steps of the fifth aspect is provided.
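An informal sketch of the neighborhood-based minimum filter described in paragraphs [0040] to [0043] is given below. It uses the minimum_filter routine from scipy.ndimage, together with an explicit per-pixel version that follows the wording of paragraph [0041]; the 3x3 window size is one of the sizes mentioned in paragraph [0043], and the function names are assumptions.

```python
# Hypothetical sketch of the NxN neighborhood minimum filter of paragraphs
# [0040] to [0043]; the 3x3 size and function names are illustrative assumptions.
import numpy as np
from scipy.ndimage import minimum_filter

def dilate_dark_areas(binary_reference, size=3):
    """Each output pixel takes the lowest value found in the NxN neighborhood
    of the corresponding pixel in the binary reference image, which enlarges
    dark (low intensity) areas."""
    return minimum_filter(binary_reference, size=size, mode="nearest")

def lowest_in_neighborhood(binary_reference, i, j, n=3):
    """Explicit version for a single output pixel location Pij, per [0041](ii)."""
    half = n // 2
    rows = slice(max(i - half, 0), i + half + 1)
    cols = slice(max(j - half, 0), j + half + 1)
    return binary_reference[rows, cols].min()
```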
[0045] For clarity and simplicity of description, not all combinations of elements provided in the aspects mentioned above have been expressly set out. Nevertheless, those skilled in the art will recognize directly and unambiguously that, unless it is technically not possible or explicitly stated to the contrary, the clauses that refer to one aspect are intended to apply mutatis mutandis as optional features of every other aspect to which those clauses could possibly relate.
[0046] These and other aspects will be apparent from the following specific description, provided by way of example, with reference to the accompanying drawings.
Brief description of the drawings
[0047] Figure 1 is a schematic diagram of a tint detection system comprising a media validator coupled to a personal computer (PC), where the system is suitable for implementing a method of detecting tinting in a media item according to an embodiment of the present invention; Figures 2a to 2c are flowcharts illustrating steps in capturing and processing images for a specific type and orientation of media item inserted into the media validator of Figure 1 to create a non-dye template for use in detecting dyeing in a media item; Figures 3a to 3d are pictorial diagrams illustrating images created at different stages of the non-dye template creation process described in Figures 2a to 2c; Figure 4 is a flowchart illustrating steps in detecting tinting of a media item inserted into the media validator of Figure 1 using a non-dye template created by the template creation process of Figures 2a to 2c; and Figures 5a to 5f are pictorial diagrams illustrating images created at different stages of the dye detection process described in Figure 4.
Detailed Description
[0048] Reference is first made to Figure 1, which is a simplified schematic diagram of a dye detection system 10 comprising a media item validator 12 (in the form of a banknote validator) coupled to a personal computer (PC) 14, for implementing a method of detecting tinting in a media item in accordance with an embodiment of the present invention.
[0049] The banknote validator 12 comprises a housing 13 supporting a transport mechanism 15 in the form of a set of nip rollers, comprising upper nip rollers 15a aligned with lower nip rollers 15b, extending from an entry aperture 16 to a capture aperture 18.
[0050] The entry and capture apertures 16, 18 are in the form of openings defined by the housing 13.
[0051] In use, the nip rollers 15a, b guide a media item inserted short edge first (in this embodiment a banknote 20) through an examination area 22 defined by a gap between adjacent pairs of nip rollers. While the banknote 20 is being transported through the examination area 22, the banknote 20 is selectively illuminated by light sources, including a lower linear array of infrared LEDs 24 arranged to illuminate across the long edge of the banknote 20. The infrared LEDs 24 are used for transmission measurements. Additional sources of illumination are provided for other functions of the banknote validator 12 (for example, banknote identification, forgery detection, and the like), but these are not relevant to this invention, so they will not be described here.
[0052] When the infrared LEDs 24 are illuminated, the emitted infrared radiation is incident on the underside of the banknote 20, and an optical lens 26 focuses the light transmitted through the banknote 20 onto an optical imager 28 (in this embodiment a CCD contact image sensor (CIS)).
This provides a transmitted infrared channel output from the optical imager 28. In this embodiment, the optical imager 28 comprises an array of elements, each element providing an eight-bit value of detected intensity. The CIS 28 in this embodiment is a 200 dots per inch sensor, but the outputs are averaged so that 25 dots per inch are provided.
[0053] The light source 24, lens 26, and imager 28 comprise an image collection component 30.
[0054] The banknote validator 12 includes a power and data interface 32 to allow the banknote validator 12 to transfer data to an external unit, such as an ATM (not shown), a media depository (not shown), or the PC 14, and to receive data, commands, and power from it.
[0055] The banknote validator 12 also has a controller 34 including a digital signal processor (DSP) 36 and an associated memory 38. Controller 34 controls the nip rollers 15 and the image collection component 30 (including energizing and de-energizing the light source 24). Controller 34 also collects and processes the data captured by the image collection component 30, and communicates that data and/or the results of any analysis of that data to the external unit via the data and power interface 32. Controller 34 receives infrared transmission data from the optical imager 28.
[0056] The banknote validator 12 can be coupled to (and also decoupled from) the PC 14, as shown in Figure 1. Although in some embodiments a PC is not required (the banknote validator 12 performing all the required data processing and storage), in this embodiment the PC 14 is used when binary reference images are to be created, because the PC 14 has better processing and data storage capabilities than the banknote validator 12. The banknote validator 12 can be coupled directly to the PC 14, as shown in Figure 1, or indirectly (through a network or an external unit (for example, an ATM)).
[0057] The PC 14 is a conventional PC comprising a display 52, memory 54 (in the form of SDRAM), input/output communications 56 (supporting USB standards (for connecting a keyboard, mouse, and the like), Ethernet, and the like), a storage medium 58 (in the form of a hard drive), and a processor (or processors) 60. In addition, the PC 14 runs a conventional operating system (not shown) and a non-dye template creation program 62.
[0058] The non-dye template creation program 62 receives data (in the form of captured images of media items) from the banknote validator 12 and processes the data to create non-dye templates (also referred to as binary reference images). These non-dye templates (binary reference images) can then be transferred back to the banknote validator 12 for use in ascertaining whether or not subsequently inserted media items are dyed.
Dye Detection System Operation
[0059] The dye detection system 10 can operate in two modes.
[0060] The first mode is referred to as the data collection mode. In the data collection mode, multiple media items (in this embodiment, banknotes) of the same type are fed into the banknote validator 12. The banknote validator 12 captures images of these banknotes and transfers the images to the PC 14 to allow the PC 14 to create a non-dye template (also referred to as a binary reference image) for that type and orientation of media item. A typical banknote non-dye template can be produced, for example, from one hundred non-dyed samples; that is, one hundred different banknotes of the same type, series, and orientation (all of them without any dyeing) can be inserted into the banknote validator 12 to create the non-dye template.
The greater the number of samples used, the statistically closer to the average the non-dye template will be for that banknote type, series, and orientation.
[0061] The second mode is referred to as the dye detection mode. In the dye detection mode, the banknote validator 12 can be used independently of the PC 14. When operating in the dye detection mode, the banknote validator 12 is typically located in a media depository (not shown) in an ATM (not shown) or another automated media validation machine.
[0062] In the dye detection mode, a single banknote is fed into the banknote validator 12. The banknote validator 12 captures an image of the banknote and creates a binary image from it. The banknote validator 12 then accesses a recognition template to identify the banknote (currency and/or denomination). The banknote validator 12 then accesses a corresponding non-dye template that was previously created and is stored locally in the banknote validator 12, and compares the binary image created from the banknote with the accessed non-dye template to ascertain whether the banknote is dyed beyond an acceptable degree.
[0063] These two modes of operation will be described in more detail below.
[0064] It will be appreciated that this banknote validator 12 also includes software (encoded in the DSP) to (i) identify the inserted banknote (that is, the specific currency, denomination, series, etc. of the banknote) before testing whether the banknote is dyed; and (ii) validate the banknote once it has been identified and found not to be dyed beyond an acceptable degree. Such banknote validation software is known and will not be described in detail here. Banknote validation software may include templates for validating media items, but these validation templates are different from the non-dye templates that are described here. Suitable software and hardware for media validation (including banknote validation) is available from NCR Corporation, 3097 Satellite Blvd., Duluth, GA 30096, United States, which is the assignee of this application.
Data Collection Mode for Non-dye Template Creation
[0065] The operation of the dye detection system 10 will now be described with reference to Figures 2a to 2d, which are flowcharts illustrating the steps involved in creating a non-dye template for a specific type and orientation of banknote 20. Figure 2a illustrates the steps implemented by the PC 14. Figure 2b illustrates the steps implemented by the banknote validator 12 in the data collection mode, and Figures 2c and 2d illustrate steps implemented by the PC 14 in response to data received from the banknote validator 12.
[0066] Referring first to Figure 2a, the first step is for the user to launch the non-dye template creation program 62 (hereafter "template program 62") on the PC 14 (step 102). The template program 62 presents a graphical user interface on the display 52 prompting the user to enter information about the media items that will be inserted into the banknote validator 12 (step 104). The information can be selected from drop-down menus, but includes the ability for a user to enter new information. Such information includes the currency (for example, US dollars, UK pounds, euros, and the like), the denomination (for example, 10, 20, 50, 100, 200, 500, 1,000, and the like), the series (for example, 1993 to 1996, 1996 to 2003, or the like), the number of media items in the sample (for example, ten, twenty, fifty, one hundred, one thousand, or the like), and the like. The combination of currency, denomination, and series comprises the class of the media item.
A non-dye template will be created for each class of media item that the banknote validator 12 is expected to receive.
[0067] When a user has entered the information, the template program 62 converts the entered information into predetermined codes (step 106). For example, US dollars may have the code "USD", a twenty dollar bill may have the code "20", and the like. In this example, the user will insert fifty one-hundred-euro notes (€100) short edge first, in the face up, left edge leading (FULE) orientation.
[0068] The PC 14 then prompts the user, via the display 52, to start inserting the banknotes 20, and waits for data transfer from the banknote validator 12 (step 108).
[0069] Referring now to Figure 2b, which shows the flow 110 occurring in the banknote validator 12, the first step is for the user to insert the first banknote 20 in a first orientation (in this embodiment face up with the left edge leading), which the banknote validator 12 receives (step 112).
[0070] The controller 34 then transports the banknote 20 to the examination area 22 (step 114) and causes the image collection component 30 to capture an image of the banknote 20 (an IR transmission image) (step 116).
[0071] It will be appreciated that the image capture process can be used for multiple different purposes. For example, banknotes inserted for use in creating a non-dye template can also be used to create an identification template and/or a validation template. Thus, additional channels of information (that is, additional to the IR transmission channel) can be captured at this point. In other words, the banknote validator 12 may include other light sources (for example, a green light source), not shown in Figure 1 for clarity. However, these other templates (identification and validation) are not essential to an understanding of this invention, so they will not be described in detail here. It is sufficient for those skilled in the art to appreciate that the same banknote validator can be used to create multiple different templates from each set of banknotes inserted.
[0072] Returning to Figure 2b, the image collection component 30 transmits the captured images to controller 34, which transmits the captured images to the PC 14 for processing (step 118).
[0073] The process then returns to step 112, at which the user inserts another €100 banknote.
[0074] The processing of the captured images on the PC 14 to create a non-dye template will now be described with reference to Figure 2c. Figure 2c is a flowchart illustrating the non-dye template creation flow 130 on the PC 14. The non-dye template creation flow 130 comprises the steps performed by the PC 14 on the images transmitted from the banknote validator 12.
[0075] The PC 14 receives the images of individual banknotes 20 from the banknote validator 12 (step 132) as they are imaged. Thus, although the banknote validator 12 images fifty banknotes 20 for the non-dye template, the banknote validator 12 transfers the images for each banknote 20 as soon as the images are captured.
[0076] When all images have been received by the PC 14, the images are normalized (straightened and then aligned) and adjusted (cropped or padded) (step 134). Straightening (including edge and/or corner detection), alignment, and adjustment of captured images can be implemented by techniques that are well known to those skilled in the art. See, for example, United States Patent Application No. 20090324053, which is also in the name of the assignee of the present application.
[0077] As a result of the alignment and adjustment step, (i) each image in the image set contains the same number of pixels as each of the other images in the set, and (ii) pixels in one image that relate to a feature on the banknote (for example, the number "2") are located at the same spatial position as the pixels in every other image in the set that relate to that feature.
[0078] In this embodiment, each image comprises a two-dimensional array of approximately 80 pixels by 145 pixels. Each pixel in this array has an intensity value representing the intensity of IR light transmitted through the banknote 20 at that spatial location. Thus, each pixel in an image represents a spatial location on the banknote corresponding to (and in alignment with) the x and y location of the pixel in the two-dimensional array.
[0079] The PC 14 then averages all the images in the image set on a pixel-by-pixel basis (step 136) to create an average image. This is implemented by (i) identifying a pixel location, (ii) averaging the pixel intensity values for that pixel location across all images in the image set, (iii) using that average pixel intensity value for that pixel location in the average image, and (iv) repeating steps (i) to (iii) until all pixel locations in the average image have been created. A pictorial representation of an average image 200 is shown in Figure 3a, which illustrates an image created by averaging fifty €100 banknotes inserted into the banknote validator 12. The pictorial representation of Figure 3a was created by transforming the two-dimensional array of numerical pixel intensities from the average image into shades of gray based on the pixel intensities in that average image.
[0080] The PC 14 then applies contrast stretching to the average image (step 138) to expand a central portion of the range of pixel intensity values in the average image. Contrast stretching is a known technique.
[0081] In this embodiment, a saturation of five percent (5%) is applied at both the low and high intensity values. This means that all pixels in the average image are arranged in a linear group in order of pixel intensity (that is, in a one-dimensional arrangement) and the pixel intensity of the pixel 5% of the way along the linear group is ascertained. This pixel intensity is then used as a lower limit, such that all those pixels in the average image having an intensity value less than or equal to that 5% lower limit are assigned an intensity of "0". Similarly, the pixel intensity of the pixel 95% of the way along the linear group is ascertained. This pixel intensity is then used as an upper limit, such that those pixels in the average image having an intensity value greater than or equal to that 95% upper limit are assigned an intensity of "255" (the highest value possible with 8-bit intensity values). Those pixels in the central portion (having an intensity between the lower limit and the upper limit) have their intensities rescaled so that the pixel intensities in the central portion now range from "1" to "254". It should be understood that "central portion" refers to pixel intensities, not spatial locations.
[0082] Contrast stretching improves image contrast (by expanding the central portion of the intensity range to cover the entire available range) and reduces the effects of punctures and other defects in the banknote. A pictorial representation of the contrast stretched average image 202 is shown in Figure 3b.
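Purely as an illustration of steps 136 and 138 (paragraphs [0079] to [0081]), the sketch below averages the aligned sample images pixel by pixel and then applies a contrast stretch with 5% saturation at each end of the intensity range. The variable names are assumptions, and the rescaling maps the limits directly to 0 and 255 rather than reproducing the exact "0" / "1 to 254" / "255" assignment described in paragraph [0081].

```python
# Hypothetical sketch of steps 136 and 138 (paragraphs [0079] to [0081]);
# names are illustrative assumptions and the rescaling approximates the text.
import numpy as np

def average_image(aligned_images):
    """Pixel-by-pixel mean of a list of equally sized, aligned 2-D arrays."""
    return np.stack(aligned_images).astype(np.float64).mean(axis=0)

def contrast_stretch(image, saturation=0.05, out_max=255):
    """Saturate the lowest and highest 5% of intensities and linearly rescale
    the central portion across the full available range."""
    lo = np.percentile(image, 100 * saturation)          # 5% point
    hi = np.percentile(image, 100 * (1 - saturation))    # 95% point
    stretched = (image - lo) / max(hi - lo, 1e-9) * out_max
    return np.clip(stretched, 0, out_max).astype(np.uint8)
```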
[0083] The PC 14 then creates a preliminary binary reference image from the contrast stretched image (step 140). This is implemented by applying a threshold to each pixel in the contrast stretched image to transform each pixel into a binary value. Thus, a binary reference image is created that comprises a plurality of pixels, each having either a high intensity (binary "1") or a low intensity (binary "0").
[0084] In this embodiment, the applied threshold is the darkest 10% of pixels (provided this includes at least all pixels to which an intensity of "0" has been assigned). This means that when all the pixels in the contrast stretched image are arranged in order of pixel intensity, (i) the lowest ten percent of pixels (by pixel intensity) are all assigned low intensity (binary "0"); and (ii) the highest ninety percent of pixels (by pixel intensity) are all assigned high intensity (binary "1"). A pictorial representation of the preliminary binary reference image 204 is shown in Figure 3c, in which binary "0" pixels are shown as black and binary "1" pixels are shown as white.
[0085] The PC 14 then creates a non-dye template (step 142) by applying a neighborhood-based minimum filter to the preliminary binary reference image to create a final binary reference image.
[0086] In this embodiment, the step of applying a neighborhood-based minimum filter involves preparing a matrix having the desired dimensions (which are the same dimensions as those of the images in the image set, because all images have been normalized - see step 134 above). In this embodiment, the desired dimensions are approximately 80 pixels by 145 pixels.
[0087] The value of each pixel location in the matrix is set to the lowest intensity value in the NxN neighborhood (in this embodiment a 3x3 neighborhood) of the corresponding pixel location in the preliminary binary reference image, using the template program 62. As a result, if there is a low intensity pixel (binary "0") in the 3x3 neighborhood of a pixel location in the preliminary binary reference image, the same pixel location in the final binary reference image (the matrix) will be set to binary "0". This has the effect of enlarging the dark (low intensity) areas of each 3x3 array in both the horizontal and vertical directions (unless all pixels in that array are already low intensity). This reduces the effects of any errors introduced by printing on the banknote, and the like. A pictorial representation of the non-dye template 206 (the final binary reference image) is shown in Figure 3d.
[0088] When the non-dye template 206 has been created, it is stored on the PC 14, and also transferred to a local storage medium (for example, memory 38) in the banknote validator 12 (step 144). Associated information (in addition to the binary values that comprise the pixel values in the non-dye template) is also stored as part of the non-dye template 206. That associated information includes pixel intensity information (that is, the pixel intensities before the threshold was applied) for use as threshold values, as will be described in more detail below in relation to the dye detection mode.
[0089] When all the required non-dye templates have been created and stored (in this embodiment, a non-dye template for each denomination to be validated by the banknote validator 12), the banknote validator 12 can be operated in the dye detection mode, as will be described with reference to Figure 4, which shows the flow 400 of steps performed by the banknote validator 12 in the dye detection mode.
When operating in the dye detection mode, the banknote validator 12 need not be (and in practical embodiments typically would not be) coupled to the PC 14.
Dye Detection Mode
[0090] Referring now to Figure 4, in the dye detection mode the user inserts a banknote 20 short edge first in any of the four possible orientations (in this example, face down with the left edge leading (FDLE)), which the banknote validator 12 receives (step 412).
[0091] The controller 34 then transports the received banknote 20 to the examination area 22 (step 414) and causes the image collection component 30 to capture an image of the banknote (an IR transmission image) (step 416), together with any other images required for other processes (for example, recognition and validation). A pictorial representation of the captured IR image 500 is shown in Figure 5a.
[0092] The image collection component 30 transmits the captured IR transmission image to controller 34 (step 418).
[0093] Controller 34 includes the same functionality as provided by the non-dye template creation program 62 (on the PC 14), so controller 34 normalizes the received image (step 420) in a very similar way to that described with reference to Figure 2c (see step 134).
[0094] In practical embodiments, dye detection would be conducted in parallel with banknote identification, banknote validation, and optionally banknote quality assessment, but these other processes are known and will not be described here.
[0095] Controller 34 then recognizes the banknote (step 421) so that at least the currency and denomination are known (where only one currency is received, only the denomination needs to be identified). This banknote identification (recognition) process can be performed using the normalized image, but in this embodiment it is performed using a separate image captured using a light source not described here. Suitable techniques for identifying banknotes using a system similar to the equipment of Figure 1 are described in United States Patent Application No. 20090324053, which is also in the name of the assignee of the present application.
[0096] Controller 34 then applies contrast stretching to the normalized image (step 422) using a 5% saturation at both the high and low pixel intensity values (the 5% values being taken from the average image created in step 136, and provided as part of the associated information stored in (or with) the non-dye template 206). This is the same process that was performed in step 138 (Figure 2c). A pictorial representation of the contrast stretched image 502 is shown in Figure 5b. Controller 34 then creates a binary evaluation image from the contrast stretched image (step 424) (using the process described in step 140). This is implemented by applying a threshold to each pixel in the contrast stretched image to transform each pixel into a binary value. Thus, a binary evaluation image is created that comprises a plurality of pixels, each having either a high intensity (binary "1") or a low intensity (binary "0").
[0097] In this embodiment, the applied threshold is the darkest 10% of pixels from the contrast stretched average image 202 (that is, the image shown in Figure 3b). This is provided as part of the associated information that is stored in (or with) the non-dye template 206. A pictorial representation of the binary evaluation image 504 is shown in Figure 5c.
[0098] Controller 34 then compares an orientation of the binary evaluation image 504 with an orientation of the non-dye template 206 (shown in Figures 3d and 5d) (step 426).
[0099] If the orientations do not match, then the binary evaluation image 504 needs to be rotated and/or flipped as required (step 428). In this example, the non-dye template 206 was created from banknotes fed in the face up, left edge leading (FULE) orientation, whereas the banknote being evaluated was inserted in the face down, left edge leading (FDLE) orientation, so the binary evaluation image 504 needs to be flipped. A pictorial representation of the flipped binary evaluation image 508 is shown in Figure 5e.
[0100] This reorientation step has the advantage that only one non-dye template is required for each denomination and series (rather than four non-dye templates, one for each possible banknote insertion orientation). This allows the banknote to be processed regardless of which of the four possible orientations was used to insert the banknote.
[0101] If the orientations match (or once a non-matching orientation has been corrected using a geometric transformation), the process proceeds to step 430, at which controller 34 calculates a difference image between the non-dye template 206 and the binary evaluation image 508 (reoriented if necessary) (step 430). A representation of the difference image 510 is shown in Figure 5f.
[0102] This difference image 510 is calculated by comparing a pixel in the (flipped) binary evaluation image 508 with a pixel in the non-dye template 206 at a corresponding spatial location.
[0103] In this embodiment, the difference image 510 is populated with a high intensity (non-dye) pixel (binary "1") at each location where the (flipped) binary evaluation image 508 has a high intensity pixel.
[0104] Each low intensity pixel in the binary evaluation image 508 is compared with the corresponding pixel in the non-dye template 206. If the non-dye template 206 has a low intensity pixel at that location, then the difference image 510 is populated with a high intensity (non-dye) pixel (binary "1"). If the non-dye template 206 has a high intensity pixel at that location, then the difference image 510 is populated with a low intensity (dye) pixel (binary "0"). In other words, only the low intensity pixels from the binary evaluation image 508 are compared with the corresponding pixels from the non-dye template 206 (the high intensity pixels are all transferred to the difference image 510). Only if the binary evaluation image 508 has a low intensity pixel where the non-dye template 206 has a high intensity pixel is the corresponding pixel location in the difference image 510 populated with a low intensity pixel.
[0105] In other embodiments, the difference image can be calculated using a Boolean NAND function on each pair of pixels (that is, one pixel from the binary evaluation image 508 and the corresponding pixel from the non-dye template 206). One input to the NAND function is the binary evaluation image pixel value (inverted). The other input to the NAND function is the non-dye template pixel value (not inverted). The output of the NAND function is binary "0" (low intensity) only if a pixel from the binary evaluation image 508 is binary "0" (low intensity) and the corresponding pixel from the non-dye template 206 is binary "1" (high intensity). In other words, the difference image 510 includes a dye pixel at each spatial location where a pixel in the binary evaluation image 508 has a low intensity and the corresponding pixel in the non-dye template 206 has a high intensity.
The difference image 510 also includes a non-dye pixel at each spatial location where a pixel in the binary evaluation image 508 has either (a) a low intensity and the corresponding pixel in the non-dye template 206 also has a low intensity, or (b) a high intensity.
[0106] As shown in Figure 5f, each high intensity pixel (binary "1") (also referred to as a non-dye pixel) in the difference image 510 is illustrated by a white area, and each low intensity pixel (binary "0") (also referred to as a dye pixel) is illustrated by a black area on the image of banknote 20. However, the opposite convention could be used.
[0107] It should be noted that the non-dye template 206 includes a dark area 512 (Figure 5d) that does not appear in the binary evaluation image 508. This dark area 512 does not appear in the difference image 510 because the binary evaluation image 508 does not have that dark area.
[0108] Controller 34 then checks whether the banknote meets a dyeing criterion (step 432).
[0109] In this embodiment, the dyeing criterion includes the condition that no low intensity (dye) area exceeds a maximum permissible dyeing size. In this embodiment, if an area of 9 mm by 9 mm includes only dye pixels (black areas in Figure 5f), then the banknote is rejected as dyed (step 434). The banknote can be captured by the device in which the banknote validator 12 is located, or returned to the customer, depending on preferences determined by the owner and/or operator of the banknote validator 12.
[0110] If there is no low intensity (dye) area that exceeds the maximum dyeing size (9 mm by 9 mm in this embodiment), then banknote 20 is accepted as not dyed (step 436). However, the banknote may still be rejected as a forgery, or for some other reason (for example, insufficient quality), as a result of further processing that may be part of the other functions of the banknote validator.
[0111] It will be appreciated that the above embodiment has significant advantages. For example, it provides a reliable method of detecting tinting on a media item. It is also flexible, in that the dyeing area required for a media item to be rejected as dyed can be easily updated (enlarged or reduced). It requires only one light source (infrared transmission). Only one non-dye template is required for each class of media item, regardless of which of the four possible orientations is used to insert the media item. The memory and processing requirements are relatively small, and the process is fast (typically of the order of a few tens of milliseconds) both for generating the non-dye template and for testing an inserted media item.
[0112] Various modifications may be made to the embodiment described above within the scope of the invention; for example, in other embodiments the light source 24 may comprise additional light sources, such as an upper green LED source and a lower green LED source, so that the banknote validator can perform additional functions.
[0113] In other embodiments, the dye detection system 10 may not include the PC 14. In such embodiments, the steps of the non-dye template creation flow 130 can be implemented by the banknote validator 12. However, the use of a PC 14 has the advantages of high capacity storage, high processing performance, and an easy-to-use user interface.
[0114] In other embodiments, different media items can be used (for example, checks), and media items can be inserted long edge first, or otherwise presented (for example, placed in a hopper or receptacle).
[0115] In other embodiments, a different dyeing criterion can be applied.
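As an informal illustration of the pixel comparison of paragraphs [0102] to [0105] and the dyeing criterion check of paragraphs [0108] to [0110], the sketch below computes the difference image and then looks for any square window containing only dye pixels. At the 25 dots per inch resolution mentioned in paragraph [0052], 9 mm corresponds to roughly 9 pixels; that conversion, the function names, and the use of SciPy's maximum filter are assumptions made for this sketch.

```python
# Hypothetical sketch of the difference image (paragraphs [0102] to [0105]) and the
# dyeing criterion check (paragraphs [0108] to [0110]). 1 = high intensity (non-dye),
# 0 = low intensity (dye). The 9-pixel window and names are illustrative assumptions.
import numpy as np
from scipy.ndimage import maximum_filter

def difference_image(evaluation_image, template_image):
    """Produce a dye pixel (0) only where the evaluation pixel is low intensity
    and the non-dye template pixel is high intensity; non-dye (1) elsewhere.
    Logically equivalent to NAND(inverted evaluation pixel, template pixel)."""
    dye = (evaluation_image == 0) & (template_image == 1)
    return np.where(dye, 0, 1).astype(np.uint8)

def is_dyed(diff_image, window_px=9):
    """Return True if any window_px x window_px area contains only dye pixels."""
    # Pad with non-dye pixels (cval=1) so windows that extend past the border
    # can never appear to be entirely dye.
    window_max = maximum_filter(diff_image, size=window_px, mode="constant", cval=1)
    return bool((window_max == 0).any())
```

A window consists entirely of dye pixels exactly when the maximum value inside it is zero, which is what the maximum filter evaluates at every window position.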
[0116] The steps of the methods described herein can be performed in any suitable order, or simultaneously where appropriate. The methods described herein can be performed by software in machine-readable form on a tangible storage medium or as a propagating signal.
[0117] The terms "comprising", "including", "incorporating", and "having" are used herein to recite an open-ended list of one or more elements or steps, not a closed list. When such terms are used, the elements or steps mentioned in the list are not exclusive of other elements or steps that may be added to the list.
[0118] Unless otherwise indicated by the context, the terms "a" and "an" are used herein to denote at least one of the elements, integers, steps, features, operations, or components mentioned thereafter, but do not exclude additional elements, integers, steps, features, operations, or components.
[0119] The presence of broadening words and phrases such as "one or more", "at least", "but not limited to", or other similar phrases in some instances does not mean, and should not be taken to mean, that the narrower case is intended or required in instances where such broadening phrases are not used.
Claims:
Claims (14)
[0001] Method for detecting dyeing in a media item, CHARACTERIZED by the fact that it comprises: receiving (418) an image of the media item, where the image comprises a plurality of pixels having different intensity values within a range of intensity values; using (422) pixels from the image having intensity values within a central portion of the intensity value range to create a centrally weighted image (502); applying (424) a threshold to each pixel in the centrally weighted image (502) to transform each pixel into a binary value, thereby creating an evaluation image (504) comprising a plurality of pixels, each having one of two possible values; calculating (430) a difference image (510) between a binary reference image (206) and the evaluation image (504) by comparing a pixel in the evaluation image (504) with a pixel in the binary reference image (206) at a corresponding spatial location, so that the difference image includes (i) a dye pixel at each spatial location where a pixel in the evaluation image (504) has a low intensity and the corresponding pixel in the binary reference image (206) has a high intensity, and (ii) a non-dye pixel at all other spatial locations; and indicating (434) that the media item is dyed in the event that the difference image meets a dyeing criterion (432).
[0002] Method according to claim 1, CHARACTERIZED by the fact that the dyeing criterion comprises the difference image (510) including contiguous dye pixels covering an area exceeding a maximum allowable dye area.
[0003] Method according to claim 1 or 2, CHARACTERIZED by the fact that it comprises the additional step of capturing (416) an image of the media item before the step of receiving an image of the media item.
[0004] Method according to claim 3, CHARACTERIZED by the fact that the step of capturing an image of the media item additionally comprises capturing a transmission image of the media item using an infrared radiation transmitter (24) on one side of the media item and an infrared radiation detector (28) on the opposite side of the media item.
[0005] Method according to any one of the preceding claims, CHARACTERIZED by the fact that the step of using pixels from the image having intensity values within a central portion of the intensity value range to create a centrally weighted image (502) comprises contrast stretching the received image to expand a central portion of the intensity value range, so that the centrally weighted image (502) comprises a contrast stretched image.
[0006] Method according to claim 5, CHARACTERIZED by the fact that the step of applying a threshold to each pixel in the centrally weighted image (502) additionally comprises: (i) ascertaining, from a reference image from which the binary reference image (206) was created, a threshold pixel intensity at which Y percent of all pixels in the reference image have a pixel intensity below the threshold pixel intensity, (ii) assigning a first binary value to each pixel in the centrally weighted image (502) having a pixel intensity below or equal to the threshold pixel intensity, and (iii) assigning a second binary value to each pixel in the centrally weighted image (502) having a pixel intensity above the threshold pixel intensity.
[0007] Method, according to any one of the preceding claims, CHARACTERIZED by the fact that, before calculating a difference image (510), the method comprises the additional steps of (i) comparing (426) an orientation of the evaluation image (504) with an orientation of the binary reference image (206), and (ii) where the orientations do not match, implementing (428) a geometric transformation of the evaluation image (504) to match the orientation of the binary reference image (206).

[0008] Media validator (12) operable to detect dyeing in a media item presented to it, the media validator (12) CHARACTERIZED by the fact that it comprises: a media item transport (15) to transport a media item (20); an image capture device (30) aligned with the media item transport (15) and operable to capture a two-dimensional array of pixels corresponding to the media item (20), each pixel having a pixel intensity related to a property of the media item (20) at a spatial location on the media item (20) corresponding to that pixel; and a processor (34) for controlling the media transport (15) and the image capture device (30), and also for: (i) receiving the two-dimensional array of pixels; (ii) centrally weighting the received two-dimensional array of pixels; (iii) applying a limit to each pixel in the centrally weighted array of pixels to transform each pixel into a binary value, thereby creating an evaluation image comprising a plurality of pixels, each having one of two possible values; (iv) calculating a difference image between a binary reference image and the evaluation image by comparing a pixel in the evaluation image with a pixel in the binary reference image at a corresponding spatial location, so that the difference image includes (a) a dye pixel at each spatial location where the pixel in the evaluation image has a low intensity and the corresponding pixel in the binary reference image has a high intensity, and (b) a non-dye pixel at all other spatial locations; and (v) indicating that the media item is dyed in the event that the difference image meets a dyeing criterion.

[0009] Media validator, according to claim 8, CHARACTERIZED by the fact that the media item transport (15) comprises one or more endless belts.

[0010] Media validator, according to claim 8 or 9, CHARACTERIZED by the fact that the image capture device (30) comprises a two-dimensional sensor (28) with a sensor area at least as large as the area of the media item.

[0011] Media validator, according to any one of claims 8 to 10, CHARACTERIZED by the fact that the image capture device (30) additionally comprises an infrared light source (24) located on the opposite side of a media item path from an infrared detector (28).

[0012] Media validator, according to any one of claims 8 to 11, CHARACTERIZED by the fact that the media validator comprises a banknote validator.

[0013] Media deposit, CHARACTERIZED by the fact that it comprises a media validator (12) as defined in any one of claims 8 to 12.

[0014] Media deposit, according to claim 13, CHARACTERIZED by the fact that the media deposit includes a banknote storage area and a check storage area.
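To make the claimed image-processing steps easier to follow, the sketch below illustrates one plausible reading of the method of claim 1, together with the contrast stretching of claim 5, the percentile-derived limit of claim 6, and an area-based dyeing criterion in the spirit of claim 2. It is not the patented implementation: the function names, the percentile bounds, the value of Y, and the maximum dyeing area are assumptions introduced here for illustration only.

```python
import numpy as np

def center_weight(image, low_pct=20.0, high_pct=80.0):
    """Contrast-stretch so the central portion of the intensity range is
    expanded (claim 5). The percentile bounds are assumed values."""
    lo, hi = np.percentile(image, [low_pct, high_pct])
    stretched = (image.astype(float) - lo) / max(hi - lo, 1e-6)
    return np.clip(stretched, 0.0, 1.0)

def binarize(weighted, limit):
    """Assign one of two binary values to each pixel: 0 at or below the
    limit, 1 above it (claim 6, steps (ii) and (iii))."""
    return (weighted > limit).astype(np.uint8)

def detect_dyeing(captured, reference, y_percent=30.0, max_dye_area=500):
    """End-to-end sketch of the method of claim 1 (assumed realization)."""
    weighted = center_weight(captured)          # centrally weighted image (502)
    weighted_ref = center_weight(reference)

    # Limit pixel intensity below which Y percent of the reference pixels
    # lie (claim 6, step (i)); the value of Y is an assumption.
    limit = np.percentile(weighted_ref, y_percent)

    evaluation = binarize(weighted, limit)            # evaluation image (504)
    binary_reference = binarize(weighted_ref, limit)  # binary reference image (206)

    # Difference image (510): a dye pixel wherever the evaluation image is
    # low intensity and the binary reference image is high intensity.
    diff = (evaluation == 0) & (binary_reference == 1)

    # Dyeing criterion: flag the item as dyed when the dye pixels cover
    # more than an assumed maximum dyeing area (in pixels).
    return bool(diff.sum() > max_dye_area)
```

A call such as `detect_dyeing(captured_gray, reference_gray)` would return `True` when the captured note shows enough dark (dyed) pixels in regions that are bright in the reference image of a clean note.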
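Claim 7 adds an orientation check before the difference image is computed, reflecting the fact that a banknote can be presented in more than one orientation. The sketch below is again an assumption rather than the specification's method: it resolves orientation by testing simple flip and rotate candidates of the evaluation image against the binary reference and keeping the best match. Handling a note presented with its reverse face up would additionally require the reference image for that face.

```python
import numpy as np

def match_orientation(evaluation, binary_reference):
    """Assumed realization of the geometric transformation of claim 7:
    try candidate orientations of the evaluation image and return the
    one that agrees best with the binary reference image."""
    candidates = [
        evaluation,                         # as presented
        np.flipud(evaluation),              # flipped top-to-bottom
        np.fliplr(evaluation),              # flipped left-to-right
        np.flipud(np.fliplr(evaluation)),   # rotated 180 degrees
    ]
    # Fraction of pixels that agree with the reference for each candidate.
    scores = [float(np.mean(c == binary_reference)) for c in candidates]
    return candidates[int(np.argmax(scores))]
```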
Similar technologies:
Publication number | Publication date | Patent title
BR102012023646B1 | 2020-12-01 | Method, media validator and media deposit to detect tinting on a media item
US9704031B2 | 2017-07-11 | Media identification
JP5314419B2 | 2013-10-16 | Bill authenticity judging method and bill authenticity judging device
US7513413B2 | 2009-04-07 | Correlation of suspect currency note received by ATM to the note depositor
US20050169511A1 | 2005-08-04 | Document processing system using primary and secondary pictorial image comparison
EP1489562B1 | 2006-10-25 | System and method for tracing bank notes
US8983168B2 | 2015-03-17 | System and method of categorising defects in a media item
JP2010117803A | 2010-05-27 | Paper sheet processor and paper sheet processing program
RU2562758C2 | 2015-09-10 | Method and apparatus for determining reference data set of class for classification of valuable documents
CN104881811B | 2020-01-17 | Management method, system and device for electronization of bill information
US9460345B2 | 2016-10-04 | Apparatus and method for recognizing media, financial device
US20100155463A1 | 2010-06-24 | Fraudulent document detection system and method
Garain et al. | 2008 | On automatic authenticity verification of printed security documents
Baek et al. | 2018 | Detection of counterfeit banknotes using multispectral images
CN106340115B | 2020-09-15 | Method and device for identifying serial number of paper money
TWI378406B | 2012-12-01 | Method for performing color analysis operation on image corresponding to monetary banknote
CN108986296B | 2021-08-13 | Media security verification
US10438436B2 | 2019-10-08 | Method and system for detecting staining
JP2008217518A | 2008-09-18 | Medium identifier
TWI378405B | 2012-12-01 | Method for performing currency value analysis operation
US20200294342A1 | 2020-09-17 | Systems and methods for detection of counterfeit documents
KR101544754B1 | 2015-08-18 | System for verifying and managing forged or altered checks based on web and the method thereof
JP2018045566A | 2018-03-22 | Valuable medium detector, valuable medium detection system, and valuable medium detection method
CN109074695A | 2018-12-21 | Device and method for checking the authenticity of a security element
CN107615302A | 2018-01-19 | Reading predefined text data from paper
Family patents:
Publication number | Publication date
CN103366358B | 2017-06-13
CN103366358A | 2013-10-23
EP2645339B1 | 2015-03-04
BR102012023646A2 | 2013-11-19
US8805025B2 | 2014-08-12
EP2645339A1 | 2013-10-02
US20130259301A1 | 2013-10-03
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title
GB2361765A | 2000-04-28 | 2001-10-31 | Ncr Int Inc | Media validation by diffusely reflected light
GB0106817D0 | 2001-03-19 | 2001-05-09 | Rue De Int Ltd | Monitoring method
JP3669698B2 | 2002-09-20 | 2005-07-13 | 日東電工株式会社 | Inspection method and inspection apparatus for printed matter
EP1434176A1 | 2002-12-27 | 2004-06-30 | Mars, Incorporated | Banknote validator
JP4472260B2 | 2003-02-07 | 2010-06-02 | 日本ボールドウィン株式会社 | Printing surface inspection method
US8606013B2 | 2006-08-31 | 2013-12-10 | Glory Ltd. | Paper sheet identification device and paper sheet identification method
JP5174513B2 | 2008-04-03 | 2013-04-03 | グローリー株式会社 | Paper sheet stain detection apparatus and stain detection method
US8682056B2 | 2008-06-30 | 2014-03-25 | Ncr Corporation | Media identification
US8577117B2 | 2008-06-30 | 2013-11-05 | Ncr Corporation | Evaluating soiling of a media item
JP2011028512A | 2009-07-24 | 2011-02-10 | Toshiba Corp | Method for creating dictionary for fitness determination of paper sheet, paper sheet processing apparatus, and paper sheet processing method
US10073044B2 | 2014-05-16 | 2018-09-11 | Ncr Corporation | Scanner automatic dirty/clean window detection
CN104376573B | 2014-12-03 | 2017-12-26 | 歌尔股份有限公司 | A kind of image smear detection method and system
CN104376574B | 2014-12-03 | 2017-08-18 | 歌尔股份有限公司 | A kind of image smear measuring method and system
GB2542558B | 2015-09-17 | 2018-12-05 | Spinnaker Int Ltd | Method and system for detecting staining
US9626596B1 | 2016-01-04 | 2017-04-18 | Bank Of America Corporation | Image variation engine
GB2548546A | 2016-02-18 | 2017-09-27 | Checkprint Ltd | Method and apparatus for detection of document tampering
US10275971B2 | 2016-04-22 | 2019-04-30 | Ncr Corporation | Image correction
DE102016011417A1 | 2016-09-22 | 2018-03-22 | Giesecke+Devrient Currency Technology Gmbh | Method and device for detecting color deterioration on a value document, in particular a banknote, and value-document processing system
JP6801434B2 | 2016-12-20 | 2020-12-16 | 富士通株式会社 | Bioimage processing device, bioimage processing method and bioimage processing program
KR20180081356A | 2017-01-06 | 2018-07-16 | 삼성전자주식회사 | Method for processing distortion of fingerprint image and apparatus thereof
US10212356B1 | 2017-05-31 | 2019-02-19 | Snap Inc. | Parallel high dynamic exposure range sensor
US10834283B2 | 2018-01-05 | 2020-11-10 | Datamax-O'neil Corporation | Methods, apparatuses, and systems for detecting printing defects and contaminated components of a printer
US10546160B2 | 2018-01-05 | 2020-01-28 | Datamax-O'neil Corporation | Methods, apparatuses, and systems for providing print quality feedback and controlling print quality of machine-readable indicia
US10795618B2 | 2018-01-05 | 2020-10-06 | Datamax-O'neil Corporation | Methods, apparatuses, and systems for verifying printed image and improving print quality
US10803264B2 | 2018-01-05 | 2020-10-13 | Datamax-O'neil Corporation | Method, apparatus, and system for characterizing an optical system
CN108346149B | 2018-03-02 | 2021-03-12 | 北京郁金香伙伴科技有限公司 | Image detection and processing method and device and terminal
Legal status:
2013-11-19 | B03A | Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]
2018-12-11 | B06F | Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]
2019-10-08 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2020-05-26 | B06A | Notification to applicant to reply to the report for non-patentability or inadequacy of the application [chapter 6.1 patent gazette]
2020-11-03 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2020-12-01 | B16A | Patent or certificate of addition of invention granted. Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 19/09/2012, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Application date | Patent title
US13/436,078 (US8805025B2) | 2012-03-30 | Stain detection
US13/436,078 | 2012-03-30 |